
    Investigating the Role of Auditory Feedback in a Multimodal Biking Experience


    Movement Kinematics Dynamically Modulates the Rolandic ~ 20-Hz Rhythm During Goal-Directed Executed and Observed Hand Actions

    First Online: 14 February 2018

    This study investigates whether movement kinematics similarly modulates the rolandic α and β rhythm amplitude during executed and observed goal-directed hand movements. It also assesses whether this modulation relates to corticokinematic coherence (CKC), i.e., the coupling observed between cortical activity and movement kinematics during such motor actions. Magnetoencephalography (MEG) signals were recorded from 11 right-handed healthy subjects while they performed, or observed an actor performing, the same repetitive hand-pinching action. Subjects’ and actor’s forefinger movements were monitored with an accelerometer. Coherence was computed between the acceleration signals and the amplitude of α (8–12 Hz) or β (15–25 Hz) oscillations. Coherence was also evaluated between source-projected MEG signals and their β amplitude. Coherence was mainly observed between acceleration and the amplitude of β oscillations at the movement frequency within the bilateral primary sensorimotor (SM1) cortex, with no difference between executed and observed movements. Cross-correlation between the amplitude of β oscillations at the SM1 cortex and movement acceleration was maximal when acceleration was delayed by ~100 ms, both during movement execution and observation. Coherence between source-projected MEG signals and their β amplitude during movement observation and execution did not differ significantly from that during rest. This study shows that observing others’ actions engages in the viewer’s brain dynamic modulations of the SM1 cortex β rhythm similar to those during action execution. The results support the view that different neural mechanisms might account for this modulation and CKC. These two kinematics-related phenomena might help humans understand how observed motor actions are actually performed.

    Xavier De Tiège is Postdoctorate Clinical Master Specialist at the Fonds de la Recherche Scientifique (FRS-FNRS, Brussels, Belgium). This work was supported by the program Attract of Innoviris (Grant 2015-BB2B-10 to Mathieu Bourguignon), the Spanish Ministry of Economy and Competitiveness (Grant PSI2016-77175-P to Mathieu Bourguignon), the Marie Skłodowska-Curie Action of the European Commission (grant #743562 to Mathieu Bourguignon), a “Brains Back to Brussels” grant to Veikko Jousmäki from the Institut d’Encouragement de la Recherche Scientifique et de l’Innovation de Bruxelles (Brussels, Belgium), the European Research Council (Advanced Grant #232946 to Riitta Hari), the Fonds de la Recherche Scientifique (FRS-FNRS, Belgium, Research Credit J009713), and the Academy of Finland (grants #131483 and #263800). The MEG project at the ULB-Hôpital Erasme (Brussels, Belgium) is financially supported by the Fonds Erasme.
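    The coherence and cross-correlation analysis described above can be illustrated with a short, hedged sketch: band-pass the MEG signal to the β band, take its Hilbert amplitude envelope, and relate that envelope to the acceleration signal. The sampling rate, placeholder signals, and window lengths below are assumptions for illustration, not the authors' actual pipeline.

```python
# Hedged sketch: coherence and cross-correlation between the beta-band
# amplitude envelope of a single (already source-projected) MEG signal and a
# forefinger acceleration signal. Names, sampling rate, and window lengths
# are illustrative assumptions, not the study's processing pipeline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, coherence, correlate, correlation_lags

fs = 1000.0                                   # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
meg = rng.standard_normal(60 * int(fs))       # placeholder MEG time course
acc = rng.standard_normal(60 * int(fs))       # placeholder acceleration signal

# 1) Beta-band (15-25 Hz) amplitude envelope of the MEG signal
b, a = butter(4, [15 / (fs / 2), 25 / (fs / 2)], btype="bandpass")
beta_env = np.abs(hilbert(filtfilt(b, a, meg)))

# 2) Coherence between acceleration and the beta envelope
#    (in the study this peaks at the movement frequency)
f, cxy = coherence(acc, beta_env, fs=fs, nperseg=2048)

# 3) Cross-correlation to estimate the lag of maximal coupling
#    (the study reports a maximum with acceleration delayed by ~100 ms)
env_z = (beta_env - beta_env.mean()) / beta_env.std()
acc_z = (acc - acc.mean()) / acc.std()
xcorr = correlate(env_z, acc_z, mode="full") / len(env_z)
lags_ms = correlation_lags(len(env_z), len(acc_z), mode="full") / fs * 1000
best_lag_ms = lags_ms[np.argmax(xcorr)]
print(f"coherence peak at {f[np.argmax(cxy)]:.2f} Hz, best lag {best_lag_ms:.0f} ms")
```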

    Hands help hearing: Facilitatory audiotactile interaction at low sound-intensity levels

    Auditory and vibrotactile stimuli share similar temporal patterns. A psychophysical experiment was performed to test whether this similarity would lead to an intermodal bias in the perception of sound intensity. Nine normal-hearing subjects performed a loudness-matching task with faint tones, adjusting a probe tone until it sounded as loud as a reference tone. The task was performed both while the subjects touched, and while they did not touch, a tube that vibrated simultaneously with the probe tone. The subjects chose on average 12% lower intensities (p < 0.01) for the probe tone when they touched the tube, suggesting a facilitatory interaction between the auditory and tactile senses in normal-hearing subjects.
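    As a rough illustration of the reported effect (matched probe intensities about 12% lower when touching the tube, p < 0.01), the sketch below runs a paired comparison on made-up matched levels; the data and the choice of a Wilcoxon signed-rank test are assumptions, not the study's analysis.

```python
# Hedged sketch: paired comparison of matched probe-tone intensities with and
# without touching the vibrating tube. The numbers are fabricated for
# illustration; the Wilcoxon signed-rank test is an assumed analysis choice.
import numpy as np
from scipy.stats import wilcoxon

# Matched probe intensities (arbitrary linear units), one pair per subject
no_touch = np.array([1.00, 0.95, 1.10, 1.02, 0.98, 1.05, 0.97, 1.01, 1.03])
touch    = np.array([0.88, 0.85, 0.96, 0.90, 0.86, 0.93, 0.84, 0.89, 0.91])

reduction_pct = 100 * (no_touch - touch) / no_touch
stat, p = wilcoxon(no_touch, touch)
print(f"mean reduction: {reduction_pct.mean():.1f}%  (Wilcoxon p = {p:.3f})")
```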

    Audiotactile interactions in temporal perception


    Quantitative Evaluation of Artifact Removal in Real Magnetoencephalogram Signals with Blind Source Separation

    The magnetoencephalogram (MEG) is contaminated by undesired signals, called artifacts. Among the most important are the cardiac and ocular artifacts (CA and OA, respectively) and the power-line noise (PLN). Blind source separation (BSS) has been used to reduce the influence of artifacts in the data. There is a plethora of BSS-based artifact-removal approaches, but few comparative analyses. In this study, MEG background activity from 26 subjects was processed with five widespread BSS techniques (AMUSE, SOBI, JADE, extended Infomax, and FastICA) and one constrained BSS (cBSS) technique. The ability of several combinations of BSS algorithm, epoch length, and artifact-detection metric to automatically reduce the CA, OA, and PLN was then quantified with objective criteria. The results pinpointed cBSS as a very suitable approach for removing the CA. Additionally, a combination of AMUSE or SOBI with artifact-detection metrics based on entropy or power criteria decreased the OA. Finally, the PLN was reduced by means of a spectral metric. These findings confirm the utility of BSS in artifact removal for MEG background activity.
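    A minimal sketch of the generic BSS artifact-removal pipeline compared in the study is given below: decompose the multichannel MEG into components, flag artifactual components with simple detection metrics, and reconstruct the data without them. FastICA stands in for the specific algorithms evaluated, and the kurtosis and 50-Hz-power criteria with their thresholds are illustrative assumptions, not the metrics used in the paper.

```python
# Hedged sketch of a generic BSS-based artifact-removal pipeline: decompose,
# flag artifactual components, and back-project the remaining ones. FastICA
# and the kurtosis / line-power criteria are stand-ins; thresholds are assumed.
import numpy as np
from scipy.stats import kurtosis
from scipy.signal import welch
from sklearn.decomposition import FastICA

def remove_artifacts(meg, fs, kurt_thresh=10.0, line_freq=50.0, line_ratio=0.5):
    """meg: array of shape (n_samples, n_channels); returns a cleaned array."""
    ica = FastICA(n_components=meg.shape[1], random_state=0, max_iter=1000)
    sources = ica.fit_transform(meg)              # (n_samples, n_components)

    bad = []
    for i in range(sources.shape[1]):
        s = sources[:, i]
        # Heavily peaked components often capture cardiac/ocular artifacts
        if kurtosis(s) > kurt_thresh:
            bad.append(i)
            continue
        # Components dominated by power at the line frequency capture PLN
        f, pxx = welch(s, fs=fs, nperseg=int(2 * fs))
        line_band = (f > line_freq - 1) & (f < line_freq + 1)
        if pxx[line_band].sum() / pxx.sum() > line_ratio:
            bad.append(i)

    sources[:, bad] = 0.0                         # drop flagged components
    return ica.inverse_transform(sources)         # back-project to channels
```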

    Vibration-induced auditory-cortex activation in a congenitally deaf adult

    Considerable changes take place in the number of cerebral neurons, synapses and axons during development, mainly as a result of competition between different neural activities [1–4]. Studies using animals suggest that when input from one sensory modality is deprived early in development, the affected neural structures have the potential to mediate functions for the remaining modalities [5–8]. We now show that similar potential exists in the human auditory system: vibrotactile stimuli, applied on the palm and fingers of a congenitally deaf adult, activated his auditory cortices. The recorded magnetoencephalographic (MEG) signals also indicated that the auditory cortices were able to discriminate between the applied 180 Hz and 250 Hz vibration frequencies. Our findings suggest that human cortical areas, normally subserving hearing, may process vibrotactile information in the congenitally deaf.

    Attenuation of Somatosensory Responses to Self-Produced Tactile Stimulation

    Sensory stimulation resulting from one's own behavior or from the outside world is easily differentiated by healthy persons, who are able to predict the sensory consequences of their own actions. This ability has been related to cortical attenuation of the activation elicited by self-produced stimulation. To date, however, the neural processes underlying this modulation remain to be elucidated. We therefore recorded whole-scalp magnetoencephalographic (MEG) signals from 10 young adults either when they were touched by another person with a brush or when they touched themselves with the same device. The main MEG responses peaked at the primary somatosensory cortex at 54 ± 2 ms. Signals and source strengths were about a fifth weaker for self-produced than for external touch. Importantly, the attenuation was present in each subject. Control recordings indicated that the suppression was caused neither by the hand movements as such nor by visual cues. The very early onset of the attenuation, already about 30 ms after stimulation onset, is in line with the hypothesis that forward mechanisms, based on motor commands, underlie the differentiation between self-produced and externally produced tactile sensations.
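    The attenuation reported here (responses about a fifth weaker for self-produced touch, peaking at ~54 ms) can be illustrated with a short sketch that compares the early peak of two evoked responses; the simulated waveforms, time axis, and peak-search window below are placeholders, not the study's data or analysis code.

```python
# Hedged sketch: quantifying attenuation by comparing the early peak of two
# evoked responses (externally produced vs. self-produced touch). The evoked
# arrays are simulated placeholders with an assumed ~20% amplitude difference.
import numpy as np

fs = 1000.0                                    # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.3, 1 / fs)               # time axis in seconds
rng = np.random.default_rng(0)
template = np.exp(-((t - 0.054) ** 2) / (2 * 0.01**2))
evoked_external = template + 0.02 * rng.standard_normal(t.size)
evoked_self = 0.8 * template + 0.02 * rng.standard_normal(t.size)

window = (t >= 0.03) & (t <= 0.10)             # search the early response only

def peak(evoked):
    idx = np.argmax(np.abs(evoked[window]))
    return t[window][idx] * 1000, np.abs(evoked[window][idx])

lat_ext, amp_ext = peak(evoked_external)
lat_self, amp_self = peak(evoked_self)
print(f"external: {amp_ext:.2f} at {lat_ext:.0f} ms; self: {amp_self:.2f} at {lat_self:.0f} ms")
print(f"attenuation: {100 * (1 - amp_self / amp_ext):.0f}%")
```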